In probability theory, the Chernoff bound, named after Herman Chernoff but due to Herman Rubin, gives exponentially decreasing bounds on tail distributions of sums of independent random variables. It is sharper than the first- or second-moment-based tail bounds, such as Markov's inequality or Chebyshev's inequality, which yield only power-law bounds on tail decay. However, the Chernoff bound requires that the variates be independent, a condition that neither Markov's nor Chebyshev's inequality requires. It is related to the (historically prior) Bernstein inequalities, and to Hoeffding's inequality.

==Example==

Let ''X''<sub>1</sub>, ..., ''X''<sub>''n''</sub> be independent Bernoulli random variables, each having probability ''p'' > 1/2 of being equal to 1. Then the probability of simultaneous occurrence of more than ''n''/2 of the events {''X''<sub>''k''</sub> = 1} has an exact value ''S'', where

:<math>S = \sum_{i = \lfloor n/2 \rfloor + 1}^{n} \binom{n}{i} p^i (1-p)^{n-i}.</math>

The Chernoff bound shows that ''S'' has the following lower bound:

:<math>S \ge 1 - e^{-\frac{n}{2p}\left(p - \frac{1}{2}\right)^2}.</math>

Indeed, writing <math>X = X_1 + \cdots + X_n</math> and noticing that <math>\mu = np</math>, we get by the multiplicative form of Chernoff bound (see below or Corollary 13.3 in Sinclair's class notes), taking <math>\delta = 1 - \frac{1}{2p}</math>,

:<math>\Pr\left(X \le \frac{n}{2}\right) = \Pr\left(X \le (1-\delta)\mu\right) \le e^{-\mu\delta^2/2} = e^{-\frac{n}{2p}\left(p - \frac{1}{2}\right)^2}.</math>

This result admits various generalizations, as outlined below. One can encounter many flavours of Chernoff bounds: the original ''additive form'' (which gives a bound on the absolute error) or the more practical ''multiplicative form'' (which bounds the error relative to the mean).
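The example above can be checked numerically: the exact tail probability ''S'' is a finite binomial sum, and the Chernoff lower bound should never exceed it. A minimal Python sketch (the function names here are illustrative, not from any standard library):

```python
import math

def exact_tail(n: int, p: float) -> float:
    # S = P(more than n/2 of the X_k equal 1) for i.i.d. Bernoulli(p) variables:
    # sum of binomial probabilities for i = floor(n/2) + 1, ..., n.
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(n // 2 + 1, n + 1))

def chernoff_lower_bound(n: int, p: float) -> float:
    # Lower bound S >= 1 - exp(-(n / (2p)) * (p - 1/2)^2)
    # from the multiplicative form of the Chernoff bound.
    return 1 - math.exp(-(n / (2 * p)) * (p - 0.5) ** 2)

n, p = 100, 0.7
S = exact_tail(n, p)
lb = chernoff_lower_bound(n, p)
print(f"exact S = {S:.6f}, Chernoff lower bound = {lb:.6f}")
```

For ''n'' = 100 and ''p'' = 0.7 the exact probability is very close to 1, while the Chernoff bound guarantees roughly 0.94; this illustrates that the bound is exponentially tight in ''n'' but not sharp for any fixed ''n''.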